Direct Sum

Definition: Let $V$ be a vector space and let $M_1, M_2, \ldots, M_k$ be subspaces of $V$. The sum of the subspaces $M_i$ is defined as

$$M = \{m = m_1 + m_2 + \ldots + m_k \mid m_i \in M_i,\ i = 1, 2, \ldots, k\}$$

Theorem: The sum of subspaces $M$ is a subspace of $V$.

Proof:
Let $x, y \in M$ and let $\alpha, \beta$ be scalars. Then

$$x = m_1 + m_2 + \ldots + m_k, \qquad y = \bar{m}_1 + \bar{m}_2 + \ldots + \bar{m}_k$$

with $m_i, \bar{m}_i \in M_i$ for $i = 1, 2, \ldots, k$.

$$\alpha x + \beta y = \alpha(m_1 + m_2 + \ldots + m_k) + \beta(\bar{m}_1 + \bar{m}_2 + \ldots + \bar{m}_k) = (\alpha m_1 + \beta \bar{m}_1) + (\alpha m_2 + \beta \bar{m}_2) + \ldots + (\alpha m_k + \beta \bar{m}_k)$$

Since each $M_i$ is a subspace, $\alpha m_i + \beta \bar{m}_i \in M_i$ for $i = 1, 2, \ldots, k$, so $\alpha x + \beta y \in M$.

Remark: Let $V = M_1 + M_2 + \ldots + M_k$, so that every $x \in V$ can be written as

$$x = m_1 + m_2 + \ldots + m_k \quad \text{where } m_i \in M_i \text{ for } i = 1, 2, \ldots, k$$

The subspaces $M_1, M_2, \ldots, M_k$ are said to be linearly independent if

$$m_1 + m_2 + \ldots + m_k = 0 \implies m_1 = m_2 = \ldots = m_k = 0$$

Definition: Let $M_1, M_2, \ldots, M_k$ be subspaces of $V$ such that

i) $M = M_1 + M_2 + \ldots + M_k$
ii) $M_1, M_2, \ldots, M_k$ are linearly independent

Then $M$ is called the direct sum of $M_1, M_2, \ldots, M_k$ and we write $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$.

When the sum is direct, the dimensions of the subspaces add up to the dimension of the direct sum: $\dim M = \dim M_1 + \dim M_2 + \ldots + \dim M_k$.


Example: $V = \mathbb{R}^4$, $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} \in \mathbb{R}^4$

$M_1 = \{x \in \mathbb{R}^4 \mid x_3 = x_4 = 0\}$, $\dim M_1 = 2$
$M_2 = \{x \in \mathbb{R}^4 \mid x_1 = x_2 = 0\}$, $\dim M_2 = 2$
$M_3 = \{x \in \mathbb{R}^4 \mid x_1 = 0\}$, $\dim M_3 = 3$

Here $\mathbb{R}^4 = M_1 \oplus M_2$ and indeed $\dim M_1 + \dim M_2 = 4$. On the other hand, $M_1 + M_3 = \mathbb{R}^4$ but the sum is not direct, since $M_1 \cap M_3 \neq \{0\}$ and $\dim M_1 + \dim M_3 = 5 \neq 4$.
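As a rough numerical sketch of this example (the helper name `is_direct_sum` is an assumption, not from the notes), the dimension count can be checked by stacking bases of the subspaces and comparing ranks:

```python
import numpy as np

# Bases for the subspaces of R^4 from the example:
# M1 = {x3 = x4 = 0}, M2 = {x1 = x2 = 0}, M3 = {x1 = 0}.
B1 = np.array([[1.0, 0, 0, 0], [0, 1, 0, 0]]).T              # columns span M1
B2 = np.array([[0, 0, 1.0, 0], [0, 0, 0, 1]]).T              # columns span M2
B3 = np.array([[0, 1.0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]).T  # columns span M3

def is_direct_sum(*bases):
    """The sum is direct iff the stacked bases have full column rank,
    i.e. dim(M1 + ... + Mk) = dim M1 + ... + dim Mk."""
    B = np.hstack(bases)
    return np.linalg.matrix_rank(B) == B.shape[1]

print(is_direct_sum(B1, B2))  # True:  2 + 2 = 4 = dim(M1 + M2)
print(is_direct_sum(B1, B3))  # False: 2 + 3 = 5 > dim(M1 + M3) = 4
```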


Definition: If $M = V$, then $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ is called a direct sum decomposition of $V$.

Remark: Let $V = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ be a direct sum decomposition of $V$. Then the representation of each $x \in V$ as $x = m_1 + m_2 + \ldots + m_k$ with $m_i \in M_i$ is unique.

Proof: Suppose $x$ has two such representations,

$$x = m_1 + m_2 + \ldots + m_k = \bar{m}_1 + \bar{m}_2 + \ldots + \bar{m}_k, \qquad m_i, \bar{m}_i \in M_i$$

Subtracting one from the other gives

$$(m_1 - \bar{m}_1) + (m_2 - \bar{m}_2) + \ldots + (m_k - \bar{m}_k) = 0$$

with $m_i - \bar{m}_i \in M_i$. Since $M_1, M_2, \ldots, M_k$ are linearly independent, $m_i - \bar{m}_i = 0$, i.e. $m_i = \bar{m}_i$ for $i = 1, 2, \ldots, k$.
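As an illustrative numerical sketch (the bases and test vector below are assumptions echoing the earlier $\mathbb{R}^4$ example, not taken from the notes), the components $m_i$ of a direct-sum representation can be recovered by solving a linear system; uniqueness corresponds to the stacked basis matrix being invertible:

```python
import numpy as np

# R^4 = M1 (+) M2 with M1 = {x3 = x4 = 0}, M2 = {x1 = x2 = 0}.
B1 = np.array([[1.0, 0, 0, 0], [0, 1, 0, 0]]).T  # columns span M1
B2 = np.array([[0, 0, 1.0, 0], [0, 0, 0, 1]]).T  # columns span M2
B = np.hstack([B1, B2])        # invertible exactly because the sum is direct

x = np.array([1.0, 2.0, 3.0, 4.0])
c = np.linalg.solve(B, x)      # unique coefficient vector
m1 = B1 @ c[:2]                # component of x in M1
m2 = B2 @ c[2:]                # component of x in M2

print(m1, m2)  # [1. 2. 0. 0.] [0. 0. 3. 4.]
```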

Definition: Let $V$ be an inner product space and let $M_1$ and $M_2$ be subspaces of $V$. $M_1$ and $M_2$ are said to be orthogonal if

$$\langle m_1, m_2 \rangle = 0 \quad \forall\, m_1 \in M_1 \text{ and } m_2 \in M_2$$

Orthogonality is denoted by $M_1 \perp M_2$.
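For a quick numerical illustration (using the $M_1$ and $M_2$ from the earlier $\mathbb{R}^4$ example, an assumption on my part), orthogonality of two subspaces can be checked on bases, since $\langle m_1, m_2 \rangle = 0$ extends from basis vectors to all of $M_1$ and $M_2$ by linearity:

```python
import numpy as np

# M1 = {x3 = x4 = 0} and M2 = {x1 = x2 = 0} in R^4, standard inner product.
B1 = np.array([[1.0, 0, 0, 0], [0, 1, 0, 0]]).T  # columns span M1
B2 = np.array([[0, 0, 1.0, 0], [0, 0, 0, 1]]).T  # columns span M2

G = B1.T @ B2                # matrix of cross inner products <b_i, b_j>
print(np.allclose(G, 0.0))   # True: M1 perp M2
```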


Definition: Let $M = M_1 \oplus M_2 \oplus \ldots \oplus M_k$ and let $M_1, M_2, \ldots, M_k$ be pairwise orthogonal, $M_1 \perp M_2 \perp \ldots \perp M_k$. Then $M$ is called the orthogonal direct sum of $M_1, M_2, \ldots, M_k$ and we write $M = M_1 \substack{\perp \\ \oplus} M_2 \substack{\perp \\ \oplus} \ldots \substack{\perp \\ \oplus} M_k$.


Definition: Let $M$ be a subspace of the inner product space $V$. The orthogonal complement of $M$ is defined as

$$M^{\perp} = \{x \in V \mid \langle x, m \rangle = 0 \quad \forall\, m \in M\}$$

Theorem: $M^{\perp}$ is a subspace of $V$.
Proof: Let $x, y \in M^{\perp}$; we must show that $\alpha x + \beta y \in M^{\perp}$. For any $m \in M$,

$$\langle \alpha x + \beta y, m \rangle = \alpha \langle x, m \rangle + \beta \langle y, m \rangle = 0$$

Therefore $\alpha x + \beta y \in M^{\perp}$.


Example: $V = \mathbb{R}^3$, $M = \text{span}\left\{\begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix}, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix}\right\}$, $M^{\perp} = ?$

Solution: Let $x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} \in M^{\perp}$. Then $\langle x, m_1 \rangle = 0$ and $\langle x, m_2 \rangle = 0$:

$$\left\langle x, \begin{bmatrix} 0 \\ -1 \\ 1 \end{bmatrix} \right\rangle = -x_2 + x_3 = 0 \implies x_2 = x_3$$

$$\left\langle x, \begin{bmatrix} -1 \\ 0 \\ 1 \end{bmatrix} \right\rangle = -x_1 + x_3 = 0 \implies x_1 = x_3$$

$$x = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \end{bmatrix} = \begin{bmatrix} x_3 \\ x_3 \\ x_3 \end{bmatrix} = x_3 \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$$

$$M^{\perp} = \text{span}\left\{\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}\right\}$$
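The example can be double-checked numerically (an illustrative sketch; the matrix name `A` and the rank tolerance are my choices): $M^{\perp}$ is the null space of the matrix whose rows are the spanning vectors of $M$, since $x \in M^{\perp}$ means $\langle x, m_i \rangle = 0$ for each spanning vector $m_i$.

```python
import numpy as np

# Rows of A are the spanning vectors of M; null(A) = M-perp.
A = np.array([[ 0.0, -1.0, 1.0],
              [-1.0,  0.0, 1.0]])

# Null space via SVD: the right singular vectors beyond the rank span it.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-10))
basis = Vt[rank:].T          # columns of basis span null(A) = M-perp

v = basis[:, 0]
print(v / v[0])              # scaled so the first entry is 1 -> [1. 1. 1.]
```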


Theorem: Let $V$ be an inner product space and let $M$ be a subspace of $V$. Then we can always write $M \oplus M^{\perp} = V$. That is, $V$ can always be written as the direct sum of a subspace and its orthogonal complement.

Proof: We need to show two things:

  1. $M$ and $M^{\perp}$ are linearly independent.
  2. Any $x \in V$ can be written as $x = m + m^{\perp}$ where $m \in M$ and $m^{\perp} \in M^{\perp}$.

See lecture notes for the full proof.
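A numerical sketch of item 2 (assuming the subspace $M$ from the earlier $\mathbb{R}^3$ example and an arbitrary test vector, neither taken from the notes): the orthogonal projection of $x$ onto $M$ gives the component $m$, and the residual is the component $m^{\perp}$.

```python
import numpy as np

# Columns of B are the spanning vectors of M = span{(0,-1,1), (-1,0,1)}.
B = np.array([[ 0.0, -1.0],
              [-1.0,  0.0],
              [ 1.0,  1.0]])

# Orthogonal projection matrix onto M (B has full column rank).
P = B @ np.linalg.inv(B.T @ B) @ B.T

x = np.array([3.0, 1.0, 2.0])   # arbitrary test vector
m = P @ x                       # component of x in M
m_perp = x - m                  # component of x in M-perp

print(np.allclose(x, m + m_perp))      # True: x = m + m_perp
print(np.allclose(B.T @ m_perp, 0.0))  # True: m_perp is orthogonal to M
```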


#EE501 - Linear Systems Theory at METU